Perceptual reversals need no prompting by attention

Authors

Abstract


Similar articles

Perceptual reversals need no prompting by attention.

Many ambiguous patterns elicit spontaneous alternations of phenomenal appearance. Attention is known to influence these phenomenal reversals, as do several other factors. We asked whether a shift of attention individually prompts each reversal of phenomenal appearance. By combining intermittent presentation with a proven method of attention control, we monitored phenomenal alternations in the c...


No Stopping and No Slowing: Removing Visual Attention with No Effect on Reversals of Phenomenal Appearance

We investigated whether visual selective attention contributes to the reversals of phenomenal appearance that characterize multi-stable displays. We employed a rotating-ring display that reverses appearance only at certain phases of its rotation (i.e., when in full-frontal view). During this critical window of time, observers were required to perform a shape discrimination task, thus diverting ...


Developmental Reversals in Attention

One of the law-like regularities of psychological science is that of developmental progression – an increase in sensorimotor, cognitive, and social functioning from childhood to adulthood. Here we report a rare violation of this law – a developmental reversal in attention. In Experiment 1, 4- to 5-year-olds (N = 34) and adults (N = 35) performed a change detection task that included externally cue...


Conscious Perceptual Experience as Representational Self-Prompting

John Dilworth Western Michigan University [Journal of Mind and Behavior 28 no. 2 (2007), 135-156] The self-prompting theory of consciousness holds that conscious perceptual experience occurs when non-routine perceptual data prompt the activation of a plan in an executive control system that monitors perceptual input. On the other hand, routine, non-conscious perception merely provides data abou...


No Need to Pay Attention: Simple Recurrent Neural Networks Work!

First-order factoid question answering assumes that the question can be answered by a single fact in a knowledge base (KB). While this does not seem like a challenging task, many recent attempts that apply either complex linguistic reasoning or deep neural networks achieve 65%–76% accuracy on benchmark sets. Our approach formulates the task as two machine learning problems: detecting the entiti...



Journal

Journal title: Journal of Vision

Year: 2007

ISSN: 1534-7362

DOI: 10.1167/7.10.5